Bayesian Optimization Algorithm, Decision Graphs, and Occam's Razor

Authors

  • Martin Pelikan
  • David E. Goldberg
  • Kumara Sastry
Abstract

This paper discusses the use of various scoring metrics in the Bayesian optimization algorithm (BOA), which uses Bayesian networks to model promising solutions and generate new ones. The use of decision graphs in Bayesian networks to improve the performance of the BOA is proposed. To favor simple models, a complexity measure is incorporated into the Bayesian-Dirichlet metric for Bayesian networks with decision graphs. The presented algorithms are compared on a number of interesting problems.
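The loop the abstract describes — select promising solutions, fit a probabilistic model to them, and sample the next population from that model — can be sketched as follows. Note this is a minimal illustration, not the paper's algorithm: the real BOA learns a Bayesian network with decision graphs over the selected solutions, whereas this sketch substitutes a univariate marginal model (as in UMDA) to stay short, and `onemax` is an illustrative fitness function.

```python
import random

def onemax(bits):
    """Illustrative fitness: count of 1-bits (not from the paper)."""
    return sum(bits)

def eda_step(population, fitness, keep=0.5):
    """One generation: select promising solutions, fit a model, resample.

    BOA would learn a Bayesian network (with decision graphs) here; this
    sketch uses per-bit marginal frequencies as the model instead.
    """
    population.sort(key=fitness, reverse=True)
    promising = population[: int(len(population) * keep)]
    n = len(promising[0])
    # "Model": marginal probability of a 1 at each position among the
    # promising solutions.
    probs = [sum(ind[i] for ind in promising) / len(promising)
             for i in range(n)]
    # Generate the new population by sampling the model.
    return [[1 if random.random() < p else 0 for p in probs]
            for _ in range(len(population))]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(100)]
for _ in range(30):
    pop = eda_step(pop, onemax)
best = max(onemax(ind) for ind in pop)
```

A Bayesian network captures dependencies between bits that the marginal model above cannot; the paper's complexity measure then penalizes networks (decision graphs) with many parameters so that simpler models are preferred, in the spirit of Occam's razor.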


Similar resources

On the Bayesian "Occam Factors" Argument for Occam's Razor

This paper discusses some of the problematic aspects of the Bayesian first-principles “proof” of Occam’s razor which involves Occam factors. Although it is true that the posterior for a model is reduced due to Occam factors if that model is capable of expressing many functions, the phenomenon need not have anything to do with Occam’s razor. This paper shows this by i) performing reductio ad abs...


A Model Selection Criterion for Classification: Application to HMM Topology Optimization

This paper proposes a model selection criterion for classification problems. The criterion focuses on selecting models that are discriminant, rather than models based on Occam's razor principle of parsimony between accurate modeling and complexity. The criterion, dubbed Discriminative Information Criterion (DIC), is applied to the optimization of Hidden Markov Model topology aimed at the reco...


Learning as Optimization

This dissertation is concerned with inductive learning from examples, and the reduction of learning problems to associated optimization problems. The emphasis is on learning to classify. Theoretical results include (1) examining the application of Occam’s Razor in a general learning setting; (2) investigating two optimization problems associated with learning using linear threshold functions, a...


Particle swarm optimisation assisted classification using elastic net prefiltering

A novel two-stage construction algorithm for linear-in-the-parameters classifier is proposed, aiming at noisy two-class classification problems. The purpose of the first stage is to produce a prefiltered signal that is used as the desired output for the second stage to construct a sparse linear-in-the-parameters classifier. For the first stage learning of generating the prefiltered signal, a tw...


Extending Occam's Razor

Occam's Razor states that, all other things being equal, the simpler of two possible hypotheses is to be preferred. A quantified version of Occam's Razor has been proven for the PAC model of learning, giving sample-complexity bounds for learning using what Blumer et al. call an Occam algorithm [1]. We prove an analog of this result for Haussler's more general learning model, which encompasses le...




Journal title:

Volume   Issue 

Pages  -

Publication date: 2000